Abstract: Effective Human Computer Intelligent Interaction (HCII) requires information about the user's identity, state, and intent, which can be extracted from images so that computers can react accordingly, e.g. systems adapting their behaviour to the emotional state of the person. The most expressive way humans display emotion is through facial expressions. In this paper we present a technique for recognizing emotion from facial expressions, which are of prime importance in human interaction. By creating machines that can detect and understand emotion, we can enhance human-computer interaction. We discuss a framework for classifying emotional states from facial images and describe the implementation details of a real-time facial feature extraction and emotion recognition application. This application automatically detects faces in captured images and codes them with respect to seven dimensions in real time: Neutral, Anger, Disgust, Fear, Smile, Sadness, and Surprise. Most interestingly, the outputs of the classifier change smoothly as a function of time, providing a potentially valuable representation of facial expression dynamics in a fully automatic and unobtrusive manner. The main objective of our work is the real-time implementation of a facial emotion recognition system.


Keywords: Real Time, Emotion Recognition, Human Computer Interaction (HCI), Feed-Forward Neural Network, Largest Connected Region (LCR), Cubic Bezier Curve.
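
To illustrate the kind of real-time loop the abstract describes, the following minimal sketch detects a face in each webcam frame and passes the cropped region to a seven-class emotion classifier. It is an assumption-laden illustration, not the authors' pipeline: the Haar-cascade detector is an off-the-shelf OpenCV model, and classify_emotion is a hypothetical stand-in for the trained feed-forward neural network mentioned in the keywords.

    # Sketch of a real-time face detection + emotion classification loop.
    # Assumptions: OpenCV's bundled Haar cascade for face detection, and a
    # placeholder classify_emotion() standing in for a trained feed-forward network.
    import cv2
    import numpy as np

    EMOTIONS = ["Neutral", "Anger", "Disgust", "Fear", "Smile", "Sadness", "Surprise"]

    face_detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

    def classify_emotion(face_img: np.ndarray) -> np.ndarray:
        """Hypothetical stand-in for a trained feed-forward network.
        Returns one score per emotion; here the scores are random."""
        scores = np.random.rand(len(EMOTIONS))
        return scores / scores.sum()

    cap = cv2.VideoCapture(0)  # default webcam
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = face_detector.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5)
        for (x, y, w, h) in faces:
            face = cv2.resize(gray[y:y + h, x:x + w], (48, 48))
            label = EMOTIONS[int(np.argmax(classify_emotion(face)))]
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
            cv2.putText(frame, label, (x, y - 10),
                        cv2.FONT_HERSHEY_SIMPLEX, 0.8, (0, 255, 0), 2)
        cv2.imshow("emotion", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cap.release()
    cv2.destroyAllWindows()

Because the classifier is called on every frame, its per-frame scores form the smoothly varying time series of emotion dimensions referred to in the abstract.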